Chernoff Bounds, and Some Applications

Authors

  • Michel Goemans
  • Lorenzo Orecchia
Abstract

Preliminaries. Before we venture into the Chernoff bound, let us recall two simple bounds on the probability that a random variable deviates from its mean by a certain amount: Markov's inequality and Chebyshev's inequality. Markov's inequality applies only to non-negative random variables and gives a bound that depends on the expectation of the random variable.
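For reference, the two inequalities named above have standard statements (the formulas below are supplied here and are not quoted from the notes): for a non-negative random variable $X$ and any $a > 0$, Markov's inequality gives

    \Pr[X \ge a] \le \frac{\mathbb{E}[X]}{a},

and for any random variable $X$ with finite variance and any $a > 0$, Chebyshev's inequality gives

    \Pr\big[\,|X - \mathbb{E}[X]| \ge a\,\big] \le \frac{\operatorname{Var}(X)}{a^2}.

Chebyshev's inequality follows by applying Markov's inequality to the non-negative random variable $(X - \mathbb{E}[X])^2$.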

Similar resources

Geometric Applications of Chernoff-type Estimates

In this paper we present a probabilistic approach to some geometric problems in asymptotic convex geometry. The aim of this paper is to demonstrate that the well-known Chernoff bounds from probability theory can be used in a geometric context for a very broad spectrum of problems and lead to new and improved results. We begin by briefly describing Chernoff bounds and the way we will use them....

The Chernoff bounds provide very accurate information concerning the tail probabilities of most distributions. In this paper we describe some relevant properties of these bounds.
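The generic Chernoff bound alluded to here is standard (stated for reference, not quoted from the entry): for any random variable $X$ and any $a \in \mathbb{R}$,

    \Pr[X \ge a] \le \inf_{t > 0} e^{-ta}\, \mathbb{E}\big[e^{tX}\big],

obtained by applying Markov's inequality to the non-negative random variable $e^{tX}$ and optimizing over $t > 0$; the quality of the bound depends on how well the moment generating function $\mathbb{E}[e^{tX}]$ can be controlled.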

On Improved Bounds for Probability Metrics and f-Divergences

Derivation of tight bounds for probability metrics and f-divergences is of interest in information theory and statistics. This paper provides elementary proofs that lead, in some cases, to significant improvements over existing bounds; they also lead to the derivation of some existing bounds in a simplified way. The inequalities derived in this paper relate the Bhattacharyya parameter,...
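For context, the objects named in this snippet have standard definitions (supplied here, not quoted from the paper; the symbol $Z$ is chosen for illustration): for a convex function $f$ with $f(1) = 0$, the $f$-divergence between probability distributions $P$ and $Q$ on a common finite alphabet is

    D_f(P \,\|\, Q) = \sum_x q(x)\, f\!\left(\frac{p(x)}{q(x)}\right),

and the Bhattacharyya parameter (also called the Bhattacharyya coefficient) is

    Z(P, Q) = \sum_x \sqrt{p(x)\, q(x)}.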

CS 174 Lecture 10 John Canny

But we have already seen that for some random variables (e.g., the number of balls in a bin), the tail probability falls off exponentially with distance from the mean. So Markov and Chebyshev are very poor bounds for those kinds of random variables. The Chernoff bound applies to a class of random variables and does give exponential fall-off of probability with distance from the mean. The critical condition that's needed for a C...
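The snippet cuts off before stating the condition, so only the standard form of the resulting bound is recorded here (not quoted from the lecture): if $X = \sum_i X_i$ is a sum of independent random variables taking values in $\{0, 1\}$ and $\mu = \mathbb{E}[X]$, then for any $\delta > 0$,

    \Pr[X \ge (1+\delta)\mu] \le \left(\frac{e^{\delta}}{(1+\delta)^{1+\delta}}\right)^{\mu},

which exhibits exactly the exponential fall-off with distance from the mean described above; independence of the summands is what makes the moment-generating-function argument go through.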


Journal title:

Volume   Issue

Pages  -

Publication date: 2014